
AdaBoost with SVM-based component classifiers



Abstract

The use of SVM (Support Vector Machine) as a component classifier in AdaBoost may seem to go against the grain of the Boosting principle, since SVM is not an easy classifier to train. Moreover, Wickramaratna et al. [2001. Performance degradation in boosting. In: Proceedings of the Second International Workshop on Multiple Classifier Systems, pp. 11-21] show that AdaBoost with strong component classifiers is not viable. In this paper, we show that AdaBoost incorporating properly designed RBFSVM (SVM with the RBF kernel) component classifiers, which we call AdaBoostSVM, can perform as well as SVM. Furthermore, the proposed AdaBoostSVM demonstrates better generalization performance than SVM on imbalanced classification problems. The key idea of AdaBoostSVM is that, over the sequence of trained RBFSVM component classifiers, the RBF kernel width σ starts with a large value (implying weak learning) and is reduced progressively as the Boosting iteration proceeds. This effectively produces a set of RBFSVM component classifiers whose model parameters differ adaptively, yielding better generalization than an AdaBoost approach whose SVM component classifiers use a fixed (optimal) σ value. On benchmark data sets, we show that our AdaBoostSVM approach outperforms other AdaBoost approaches using component classifiers such as Decision Trees and Neural Networks. AdaBoostSVM can be seen as a proof of concept of the idea proposed in Valentini and Dietterich [2004. Bias-variance analysis of support vector machines for the development of SVM-based ensemble methods. Journal of Machine Learning Research 5, 725-775] that AdaBoost with heterogeneous SVMs could work well. Moreover, we extend AdaBoostSVM to the Diverse AdaBoostSVM to address the reported accuracy/diversity dilemma of the original AdaBoost. By designing parameter-adjusting strategies, the distributions of accuracy and diversity over the RBFSVM component classifiers are tuned to maintain a good balance between them, and promising results have been obtained on benchmark data sets.
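As a rough illustration of the key idea above, the following is a minimal sketch (not the authors' implementation) of a boosting loop whose RBF-SVM component classifiers have their kernel width σ shrunk over the iterations. It assumes scikit-learn's SVC, binary labels in {-1, +1}, and an illustrative rule of reducing σ whenever the weighted error exceeds 0.5, so each component classifier stays weak but still usable; parameter names such as sigma_init, sigma_min and sigma_step are hypothetical.

import numpy as np
from sklearn.svm import SVC

def adaboost_svm(X, y, sigma_init=10.0, sigma_min=0.5, sigma_step=0.5,
                 C=1.0, max_iter=100):
    """Boost RBF-SVM component classifiers, shrinking the kernel width sigma.

    Assumes y takes values in {-1, +1}. Returns a list of (alpha, classifier).
    """
    X = np.asarray(X)
    y = np.asarray(y, dtype=float)
    n = len(y)
    w = np.full(n, 1.0 / n)                # AdaBoost sample distribution
    sigma = sigma_init
    ensemble = []

    for _ in range(max_iter):              # cap iterations so the sketch terminates
        if sigma < sigma_min:
            break
        clf = SVC(kernel="rbf", gamma=1.0 / (2.0 * sigma ** 2), C=C)
        clf.fit(X, y, sample_weight=w)
        pred = clf.predict(X)
        err = float(np.sum(w[pred != y]))  # weighted training error

        if err > 0.5:
            # Component classifier too weak at this sigma: shrink sigma
            # (a stronger RBF-SVM) and retry, realizing the sigma-reduction idea.
            sigma -= sigma_step
            continue

        err = max(err, 1e-10)              # guard against a zero error
        alpha = 0.5 * np.log((1.0 - err) / err)
        ensemble.append((alpha, clf))

        w *= np.exp(-alpha * y * pred)     # emphasize misclassified samples
        w /= w.sum()                       # renormalize the distribution

    return ensemble

def adaboost_svm_predict(ensemble, X):
    """Weighted-vote prediction of the boosted RBF-SVM ensemble."""
    scores = sum(alpha * clf.predict(X) for alpha, clf in ensemble)
    return np.sign(scores)

The sketch uses sklearn's convention gamma = 1/(2σ²), so a large σ corresponds to a small gamma and hence a smoother, weaker RBF-SVM, which is the starting point the abstract describes.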
